# Linear Attention
## Ring-lite-linear-preview
License: MIT · Organization: inclusionAI
Tags: Large Language Model · Supports Multiple Languages

Ring-lite-linear-preview is a hybrid linear sparse large language model open-sourced by InclusionAI, with 17.1B total parameters and 3.0B activated parameters. The model performs long-context reasoning on top of a hybrid linear attention mechanism, achieving near-linear computational complexity and near-constant space complexity during inference.
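To illustrate why linear attention yields near-linear time and near-constant memory, here is a minimal NumPy sketch of the standard kernelized formulation (using the common `elu(x)+1` feature map). This is a generic illustration of the technique, not Ring-lite's actual implementation: attention weights become dot products of feature-mapped queries and keys, so a causal pass only needs a fixed-size running state instead of the full key/value history.

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """Causal linear attention: softmax is replaced by a positive feature
    map phi (here elu(x)+1), so the running sums
        S_t = sum_{i<=t} phi(k_i) v_i^T   and   z_t = sum_{i<=t} phi(k_i)
    give O(n) time and O(d * d_v) state, independent of sequence length."""
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1 > 0
    Qf, Kf = phi(Q), phi(K)
    d, d_v = Q.shape[1], V.shape[1]
    S = np.zeros((d, d_v))   # running sum of phi(k) v^T (constant size)
    z = np.zeros(d)          # running sum of phi(k) (normalizer)
    out = np.empty_like(V)
    for t in range(Q.shape[0]):
        S += np.outer(Kf[t], V[t])
        z += Kf[t]
        out[t] = (Qf[t] @ S) / (Qf[t] @ z + eps)
    return out
```

Because the state `(S, z)` has a fixed size regardless of how many tokens have been processed, decoding cost per token stays constant, which is the "near-constant space complexity" claimed above.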
## MoLFormer-XL-both-10pct
License: Apache-2.0 · Organization: ibm-research
Tags: Molecular Model · Transformers

MoLFormer is a chemical language model pre-trained on 1.1 billion molecular SMILES strings from ZINC and PubChem. This version was trained on a 10% sample of each dataset.
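Models of this kind are pre-trained with a masked-language-model objective over SMILES strings. The following is a hypothetical, simplified sketch of that data pipeline (the tokenizer regex and `mask_tokens` helper are illustrative, not MoLFormer's actual code): a SMILES string is split into chemically meaningful tokens, and a fraction of them is hidden behind a `[MASK]` symbol for the model to reconstruct.

```python
import random
import re

# Illustrative SMILES tokenizer: bracketed atoms, two-letter halogens,
# common organic-subset atoms, ring-closure digits, and bond/branch symbols.
SMILES_TOKEN = re.compile(
    r"\[[^\]]+\]|Br|Cl|[BCNOPSFIbcnops]|[0-9]|[()=#+\-/\\@%.]"
)

def tokenize_smiles(smiles):
    """Split a SMILES string into tokens (sketch, not MoLFormer's tokenizer)."""
    return SMILES_TOKEN.findall(smiles)

def mask_tokens(tokens, mask_prob=0.15, seed=None):
    """Replace roughly mask_prob of the tokens with "[MASK]"; return the
    corrupted sequence plus (position, original_token) labels to predict."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            corrupted.append("[MASK]")
            labels.append((i, tok))
        else:
            corrupted.append(tok)
    return corrupted, labels
```

For example, `tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O")` (aspirin) yields per-atom and per-symbol tokens whose concatenation reproduces the input, and `mask_tokens` corrupts that sequence for the reconstruction objective.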